
Facebook urges against 'punitive' fines for firms who breach Government's new online safety laws

The government published submissions on the proposed laws today.

FACEBOOK HAS CALLED on the Government not to introduce “punitive” fines for social media companies found to be in breach of proposed online safety laws.

In a submission to the Department of Communications, the social media giant also warned about the potential free speech implications of the proposed Online Safety Act.

The legislation proposes the creation of an Online Safety Commissioner, who will review how companies moderate their content, order companies to remove individual pieces of content within a certain timeframe, and impose fines on those who fail to comply.

Under new proposals, online companies could be required to implement an Online Safety Code, which will outline how they are keeping their users safe online.

The code could also see companies required to provide clear information to their users about how material posted on their platforms can be removed. 

Earlier this year, the government carried out a six-week public consultation on the proposals, asking how ‘harmful’ content should be defined, what sanctions should be imposed on companies, and which online platforms should be covered.

The Department received 84 submissions from individuals, government bodies, charities, and companies including Facebook, Google, Vodafone, RTÉ, TG4 and Independent News & Media, which have been published today.

Freedom of speech

In a 40-page submission, Facebook said that it supported the Government’s aim to deal with so-called ‘harmful’ content online, adding that it welcomed the introduction of legislation that would streamline the process for having ‘harmful’ content removed.

However, the company cautioned that this should be done in a way that did not curtail “legitimate freedom of speech and freedom of expression”.

It suggested that any code relating to the removal of content online should encourage users to contact social media companies with a complaint first, which could later be escalated to a regulator, saying that such a process would be more efficient.

Facebook also said that in cases where the service provider disagreed that certain content was harmful, the regulator could issue a decision requiring the removal of the content, but it added that this should be open to be challenged in court.

“This is particularly important given the free speech implications that may be at play in cases where the regulator is seeking to demand the removal of content that is deemed harmful but may not be unlawful,” the submission reads.

The social media firm said it recognised that a regulator should be able to impose sanctions on companies that did not adhere to the proposed code, but said that such sanctions should only be applied to companies that regularly fail to comply with orders to remove content.

It further submitted that any sanctions should have regard to provisions of the Constitution, citing Article 34.1 and Article 38.1, which concern the administration of justice in the courts and the trial of criminal charges in due course of law.

The company also called for any “administrative fines” to be proportionate, and for mitigation factors to be tied to any system of sanctions, adding:

It is again important that such fines are not so high as to essentially be punitive in nature, which would be indicative of the matter being a criminal matter best resolved by the courts.

Public interest

Meanwhile, Google also warned about the potential implications that the introduction of a code for removing online content could have for the freedom of speech.

In its submission, the company cited UN recommendations about how to ensure that freedom of expression was not curtailed by regulation which protected internet users from harmful content online.

“The protection and promotion of the right to freedom of expression is a fundamental public interest,” the company wrote.

“Google has always sought to appropriately promote the freedom of expression guarantee across its products.”

Like Facebook, Google highlighted issues about the introduction of regulations which would punish companies for how they responded to requests to remove content.

It warned that regulations that treated companies punitively could make them less discerning in how they removed content.

Instead, it suggested, companies may end up erring on the side of caution and removing content without balancing the free-speech rights of the individuals who originally posted it.

Blanket monitoring

Meanwhile, the Irish Council for Civil Liberties (ICCL) submitted that any uncertainties about the moderation of content online should be resolved in favour of freedom of expression, privacy, and data protection.

The group warned that the government’s proposals may not adhere to human rights laws, and said that social media companies were increasingly engaging in “platform censorship”, rather than harmful content removal.

The ICCL suggested the introduction of “self-appointed filtering” – whereby users would decide for themselves what content they see – as a way to moderate content, rather than relying on companies to do so.

“These designs are nascent and still exploratory but redirect the conversation to the importance of users deciding for ourselves what we want our internet and online platforms to look like,” ICCL added.

Responding to the submissions, Minister for Communications Richard Bruton thanked those who contributed and said he would bring draft heads of bill to Government after considering them.

He also said that better controls needed to be in place for moderating content online, and added that while he would listen to all views, he would not allow a system of self-regulation to continue.

“We will not accept a situation whereby online companies are permitted to regulate themselves,” he said. “That day is gone.”

Author
Stephen McDermott